Toward Worker-Centric Crowdsourcing
Abstract
Today, crowdsourcing is used to “taskify” any job, ranging from simple receipt transcription to collaborative editing, fan-subbing, and citizen science. Existing work has mainly focused on improving the processes of task assignment and task completion in a requester-centric way, optimizing for outcome quality under budget constraints. In this paper, we advocate that accounting for workers’ characteristics, i.e., human factors, in task assignment and task completion benefits both workers and requesters, and we discuss new opportunities raised by worker-centric crowdsourcing. This survey is based on a tutorial that was recently given at PVLDB [2].

1 A Case for Worker-Centric Crowdsourcing

As more jobs are “taskified” and executed on crowdsourcing platforms, the role of human workers online is gaining importance. On virtual marketplaces such as Amazon Mechanical Turk, PyBossa, and Crowd4U, the crowd is volatile, its arrivals and departures are asynchronous, and its levels of attention and accuracy are diverse. Tasks differ in complexity and require the participation of workers with varying degrees of expertise. As workers continue to get involved in crowdsourcing, a legitimate question is how to improve both their performance and their experience. Existing proposals have mostly been concerned with the development of requester-centric algorithms to match tasks and workers, and with preemptive approaches to improve task completion. We believe that new opportunities for developing models and algorithms are yet to be explored in bridging the gap between Social Science studies and Computer Science. Naturally, understanding the characteristics of workers, here referred to as human factors, that directly impact their performance and their experience on the platform is a necessary step toward achieving that goal.
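As a purely illustrative sketch of what human-factor-aware task assignment could look like, the toy Python snippet below greedily matches each task to the worker maximizing a blend of skill overlap and topic affinity. All data structures, field names, and weights here are hypothetical choices for illustration; the paper does not prescribe this particular scheme.

```python
# Hypothetical sketch: assign tasks to workers using both a skill match
# (requester-centric signal) and a per-topic affinity score standing in
# for a human factor such as motivation. Names and weights are illustrative.

def assign_tasks(workers, tasks, alpha=0.7):
    """Greedily assign each task to the worker maximizing a weighted blend
    of skill overlap and topic affinity (alpha trades the two off)."""
    assignments = {}
    for task in tasks:
        def score(w):
            # Fraction of the task's required skills the worker covers.
            skill = len(task["skills"] & w["skills"]) / max(len(task["skills"]), 1)
            # Worker's declared affinity for this task's topic (default 0).
            affinity = w["affinity"].get(task["topic"], 0.0)
            return alpha * skill + (1 - alpha) * affinity
        best = max(workers, key=score)
        assignments[task["id"]] = best["id"]
    return assignments

workers = [
    {"id": "w1", "skills": {"fr", "en"}, "affinity": {"subtitling": 0.9}},
    {"id": "w2", "skills": {"en", "geo"}, "affinity": {"disaster": 0.8}},
]
tasks = [
    {"id": "t1", "topic": "subtitling", "skills": {"fr", "en"}},
    {"id": "t2", "topic": "disaster", "skills": {"geo"}},
]
print(assign_tasks(workers, tasks))  # → {'t1': 'w1', 't2': 'w2'}
```

A worker-centric refinement of this sketch would additionally constrain the assignment so that each worker's experience improves, e.g., by bounding repetitive tasks per worker, rather than optimizing requester-side fit alone.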
We advocate a re-focus of crowdsourcing research on how best to leverage human factors at all stages, which will widen the scope and impact of crowdsourcing and make it beneficial to both requesters and workers. Several other complementary surveys can be found in the literature [3, 6]. Common tasks, such as labeling images or determining the sentiment of a piece of text, can be completed by each worker independently. These types of crowdsourcing tasks are known as micro-tasks. An emerging area of interest is collaborative crowdsourcing, where workers complete a task together. Examples include fan-subbing, where workers with complementary skills collaborate to generate movie subtitles in various languages just hours after movies are made available. Disaster reporting is another example, where geographically close people with diverse and complementary skills work together to report the aftermath of an earthquake.

Copyright 2016 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes, or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works, must be obtained from the IEEE.
Bulletin of the IEEE Computer Society Technical Committee on Data Engineering
Similar Articles
Letter from the Editor-in-Chief
Enabling Trust in Crowd Labor Relations through Identity Sharing
While online crowdsourcing marketplaces provide a powerful avenue for facilitating new forms of information-driven micro-labor, their practical value is significantly reduced by worker “spam” and employer fraud. We hypothesize that the anonymity of parties is a major source of these problems, and we thus propose a human-centric solution: encourage employers and workers to voluntarily deanonymize in order...
Budgeted Online Assignment in Crowdsourcing Markets: Theory and Practice
We consider the following budgeted online assignment (BOA) problem, motivated by crowdsourcing. We are given a set of offline tasks that need to be assigned to workers who come online from a pool of types {1, 2, ..., n}. For a given time horizon {1, 2, ..., T}, at each instant of time t, a worker j arrives from the pool in accordance with a known probability distribution [p_{j,t}] such that ...
Motivation-Aware Task Assignment in Crowdsourcing
We investigate how to leverage the notion of motivation in assigning tasks to workers and improving the performance of a crowdsourcing system. In particular, we propose to model motivation as the balance between task diversity, i.e., the difference in skills among the tasks to complete, and task payment, i.e., the difference between how much a chosen task offers to pay and how much other availabl...
Incentives and Truthful Reporting in Consensus-centric Crowdsourcing
We address the challenge in crowdsourcing systems of incentivizing people to contribute to the best of their abilities. We focus on the class of crowdsourcing tasks where contributions are provided in pursuit of a single correct answer. This class includes citizen science efforts that seek input from people in identifying events and states in the world. We introduce a new payment rule, called...
Journal: IEEE Data Eng. Bull.
Volume: 39, Issue: -
Pages: -
Publication year: 2016